# 768-dimensional embedding

- **M2 BERT 128 Retrieval Encoder V1** — hazyresearch (Apache-2.0)
  An 80-million-parameter retrieval model checkpoint proposed in the paper "Benchmarking and Building Long-Context Retrieval Models with LoCo and M2-BERT".
  Tags: Text Embedding, Transformers, English
- **Bert Base 1024 Biencoder 6M Pairs** — shreyansh26
  A long-context bi-encoder built on MosaicML's pre-trained BERT with a 1024-token sequence length, designed to generate 768-dimensional dense vector representations of sentences and paragraphs.
  Tags: Text Embedding, Transformers, Multilingual
- **Sentence Transformers Gte Base** — embaas
  A sentence-transformers-based embedding model that maps sentences and paragraphs into a 768-dimensional vector space, suited to semantic search and clustering.
  Tags: Text Embedding
- **Nfcorpus Msmarco Distilbert Gpl** — GPL
  A sentence-transformers-based model that maps sentences and paragraphs to a 768-dimensional dense vector space, suited to clustering and semantic search.
  Tags: Text Embedding, Transformers
- **Fever Msmarco Distilbert Gpl** — GPL
  A sentence-transformers-based model that maps sentences and paragraphs into a 768-dimensional dense vector space, suited to sentence similarity and semantic search.
  Tags: Text Embedding, Transformers
- **Bert Retriever Squad2** — pinecone
  A sentence-transformers-based embedding model that converts text into 768-dimensional vector representations, suited to semantic similarity and text clustering.
  Tags: Text Embedding, Transformers
- **Sentencetransformer Distilbert Hinglish Big** — aditeyabaral
  A sentence-transformers-based embedding model that maps sentences and paragraphs into a 768-dimensional vector space, suited to clustering and semantic search.
  Tags: Text Embedding, Transformers
- **Paraphrase Albert Small V2** — DataikuNLP (Apache-2.0)
  A sentence-transformers model based on the ALBERT-small architecture that maps sentences and paragraphs into a 768-dimensional vector space, suited to sentence similarity and semantic search.
  Tags: Text Embedding, Transformers
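All of the models above are bi-encoders: queries and documents are embedded independently, and retrieval reduces to ranking documents by cosine similarity against the query vector. The sketch below shows only that search step, with toy 3-dimensional vectors standing in for the 768-dimensional embeddings a real encoder would produce; the document names and vector values are illustrative, not from any of the listed models.

```python
import numpy as np

# Toy corpus embeddings. A real bi-encoder from the list above would
# emit 768-dimensional vectors, but the ranking logic is identical.
corpus = {
    "doc_a": np.array([0.9, 0.1, 0.0]),
    "doc_b": np.array([0.1, 0.9, 0.1]),
    "doc_c": np.array([0.0, 0.2, 0.9]),
}

def normalize(v):
    # Unit-normalize so the dot product equals cosine similarity.
    return v / np.linalg.norm(v)

def search(query_vec, corpus, top_k=2):
    q = normalize(query_vec)
    # Score every document by cosine similarity to the query.
    scores = {name: float(normalize(v) @ q) for name, v in corpus.items()}
    # Return the top_k (name, score) pairs, highest similarity first.
    return sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]

# A query vector close in direction to doc_a ranks it first.
results = search(np.array([0.85, 0.15, 0.05]), corpus)
print(results)
```

In production the corpus vectors are precomputed once and stored in a vector index, so each query costs one encoder forward pass plus a nearest-neighbor lookup.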